
    Reproducibility - Many Small Steps Towards One Big Goal

    Reproducibility of scientific results is a cornerstone of the trustworthiness of research, and the topic is therefore playing an increasing role in public perception. Research data and research software are closely intertwined in this context. The FAIR principles developed for research data can, with some adaptations, be transferred to research software, and in the context of reproducibility they define important prerequisites for making results traceable. Many existing local and subject-specific initiatives aim to drive the cultural shift in science towards Open Science and thereby increase the quality of scientific work and the robustness of results. The talk presents the role of the German Reproducibility Network (GRN) as a platform for connecting such groups, so that different scientific communities can learn from one another and move step by step towards greater reproducibility.

    Research software and the people behind it

    Research software has been part of scientific work for many years, but its development has long been hidden and often not adequately appreciated. Research software has only recently become the focus of broader interest, and its importance for the quality of research is succinctly captured by the slogan "Better software - better research". For all those who develop software in the research environment and for research use, the term research software engineer (RSE) has become established. The community of RSEs has developed as a grassroots movement and also has a national chapter in Germany, the association de-RSE e.V. The association and its activities will be presented in the talk.

    Research software landscape and stakeholders

    Software plays a crucial role throughout the lifecycle of most scientific data and must be considered in all discussions about the openness of data and the reproducibility of science. Consequently, several interest and working groups in the RDA deal with software code. The spectrum of research software ranges from packages driven by large teams of developers and used by a broad community to small scripts that scientists write almost casually for their own work. Knowledge of modern software development practice varies greatly, as do individual skills in programming and documentation, and quality levels often differ accordingly. To reduce such differences in the medium term, general policies can be helpful: they provide a framework for a defined way of dealing with software. In Germany, some coordinated activities are delving into this topic. In the Helmholtz Association, for example, the working group Open Science expanded its perspective from an initial focus on data alone and has now set up a task group. Research software was also anchored as an issue in the Priority Initiative Digital Information of the Alliance of Science Organisations in Germany. The panels have already published recommendations on the development, use, and provision of research software, and they are currently working on guidelines that can serve as a basis for daily work at the institutions. The documents cover different facets, ranging from development practice and quality assurance to the publication and licensing of software. The talk gives an overview of the status of these activities. It also sheds light on the people behind software development: starting in the UK, those working at the interface between science and computer science have increasingly organized themselves. In Germany, too, the association de-RSE has existed since 2018 and campaigns for the interests of research software engineers.

    Workflow Treatment in C3Grid

    The Collaborative Climate Community Data and Processing Grid (C3Grid) offers distributed processing resources that can be used to run several workflows. The spectrum of workflows includes tools for the diagnosis and analysis of data (such as cyclone tracking and weather type analysis) and preparatory workflows for regional models. The talk gives a short introduction to workflow treatment in C3Grid and compares this approach with workflow treatment via WPS.

    Hidden Figures - where are they? Women in the development of research software

    The film "Hidden Figures" is based on historical facts and showed a broad audience in 2017 that women were involved in programming computers from their introduction at NASA in the 1950s, yet their achievements remained largely unknown. Today we look at a steadily growing community of Research Software Engineers and ask about diversity and the visibility of women in the development of research software.

    6th Data Science Symposium Abstracts

    The Data Science Symposium, held at the Haus der Wissenschaft in Bremen on 8-9 November 2021, was the sixth symposium in this series since 2017.

    Research software in the context of good scientific practice, Open Science, and the Helmholtz digitization strategy

    Research software plays a crucial role in the scientific working process. The talk relates the topic to the new rules for good scientific practice and to current developments in Open Science. In addition, the importance of research software in the context of the digitization strategy of the Helmholtz Association is considered.

    Newsletter of the Digital Earth Project: Contributions of the Alfred Wegener Institute to Digital Earth

    As an important technical pillar of Digital Earth, the AWI computing centre provides data management and cloud processing services to the project partners. We develop project-specific extensions to the AWI data flow framework O2A (Observation to Archive). Sensor registration in O2A will support flexible handling of sensors and their metadata; for the Digital Earth showcases, for example, methane and soil moisture measurements are in focus for smart monitoring designs and for access to data in near real time (NRT). Furthermore, data exploration is supported by a raster data manager service that can easily be coupled into users' data workflows with other data sources, such as NRT sensor data. In the following we give more details on O2A, its components, and its concept.

    Automatic data quality control for understanding extreme climate events

    The understanding of extreme events strongly depends on knowledge gained from data. Data integration of multiple sources, scales, and earth compartments is the focus of the project Digital Earth, which also joins efforts on the quality control of data. Automatic quality control is embedded in the ingest component of O2A, the observation-to-archive data flow framework of the Alfred Wegener Institute. In that framework, the O2A-Sensor provides observation properties to the O2A-Ingest, which delivers quality-flagged data to the O2A-Dashboard. The automatic quality control currently follows a procedural approach, in which modules implement formulations found in the literature and in other operational observatory networks. A set of plausibility tests, including range, spike, and gradient tests, is currently operational. The automatic quality control scans the ingested data in near-real-time (NRT) format, builds a table of devices, and searches, either by absolute or derivative values, for the correctness and validity of observations. The availability of observation properties, for instance test parameters such as physical or operational ranges, triggers the automatic quality control, which in turn iterates through the table of devices to set the quality flag for each sample and observation. To date, the quality flags in use are sequential and qualitative, i.e. they describe a level of quality in the data. A new flagging system is under development that will include a descriptive characteristic comprising technical and user interpretation. Within Digital Earth, data on flood and drought events along the Elbe River and on methane emissions in the North Sea are to be reviewed using automatic quality control. Fast and scalable automatic quality control will disentangle uncertainty raised by quality issues and thus improve our understanding of extreme events in those cases.
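    The plausibility tests described above can be sketched in a few lines. This is a minimal illustration only, assuming simple threshold-based formulations and an invented sequential flag scheme (1 = good, 3 = suspicious, 4 = bad); the actual O2A tests, thresholds, and flag values may differ.

```python
# Sketch of automatic plausibility tests (range, spike, gradient) as
# described for the O2A ingest component. All thresholds and flag values
# below are illustrative assumptions, not the operational configuration.

GOOD, SUSPICIOUS, BAD = 1, 3, 4  # assumed sequential quality flags


def range_test(value, vmin, vmax):
    """Flag a sample that falls outside its physical/operational range."""
    return GOOD if vmin <= value <= vmax else BAD


def spike_test(prev, value, nxt, threshold):
    """Flag a sample that deviates sharply from both of its neighbours."""
    spike = abs(value - (prev + nxt) / 2) - abs(nxt - prev) / 2
    return SUSPICIOUS if spike > threshold else GOOD


def gradient_test(prev, value, max_step):
    """Flag a sample whose change from the previous sample is too steep."""
    return SUSPICIOUS if abs(value - prev) > max_step else GOOD


def flag_series(values, vmin, vmax, spike_thr, max_step):
    """Apply all tests per sample and keep the worst (highest) flag."""
    flags = []
    for i, v in enumerate(values):
        flag = range_test(v, vmin, vmax)
        if 0 < i < len(values) - 1:
            flag = max(flag, spike_test(values[i - 1], v, values[i + 1], spike_thr))
        if i > 0:
            flag = max(flag, gradient_test(values[i - 1], v, max_step))
        flags.append(flag)
    return flags
```

    With illustrative thresholds, an isolated spike of 25.0 in an otherwise smooth series around 10.0 is flagged as suspicious by both the spike and gradient tests, while the surrounding samples remain good.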